
fix: use getModel() temperature as single source of truth in OpenAI handler#12144

Draft
roomote-v0[bot] wants to merge 1 commit into main from
fix/openai-temperature-from-getmodel

Conversation


@roomote-v0 roomote-v0 bot commented Apr 17, 2026

Related GitHub Issue

Closes: #12142

Description

This PR attempts to address Issue #12142. Feedback and guidance are welcome.

Root cause: The OpenAiHandler.createMessage() computed temperature independently from getModel(), ignoring defaultTemperature from custom model info. When using the OpenAI Compatible provider with models like kimi-for-coding (which only accepts temperature: 0.6), Roo Code sent temperature: 0 (the hardcoded default), resulting in a 400 error.

Fix: Make getModelParams() the single source of truth for temperature across all code paths:

  1. getModel() now detects deepseek reasoner models and passes DEEP_SEEK_DEFAULT_TEMPERATURE as defaultTemperature to getModelParams() (matching how openrouter.ts handles this).
  2. createMessage() streaming path uses the temperature returned by getModel() instead of computing it separately.
  3. createMessage() non-streaming path now includes temperature from getModel() (was previously missing entirely).
  4. completePrompt() now includes temperature from getModel() (was previously missing).
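
The four changes above all reduce to one idea: every request path reads temperature from `getModel()` instead of recomputing it. A minimal sketch of that shape (class and method names here are illustrative assumptions, not the actual Roo Code implementation):

```typescript
// Illustrative sketch: createMessage()-style paths consume getModel()'s
// temperature rather than computing their own. Names are hypothetical.
interface ModelParams {
  id: string;
  temperature: number;
}

class OpenAiHandlerSketch {
  constructor(
    private userTemperature?: number, // user override, if any
    private modelDefaultTemperature?: number, // from custom model info
  ) {}

  // Single source of truth: user override > model default > provider default (0).
  getModel(): ModelParams {
    return {
      id: "kimi-for-coding",
      temperature: this.userTemperature ?? this.modelDefaultTemperature ?? 0,
    };
  }

  // Streaming, non-streaming, and completePrompt-style paths all build
  // their request params the same way, so they can never disagree.
  buildRequestParams(stream: boolean) {
    const { id, temperature } = this.getModel();
    return { model: id, temperature, stream };
  }
}
```

With this shape, a model configured with `defaultTemperature: 0.6` yields `temperature: 0.6` on every path unless the user explicitly overrides it.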

Temperature priority is now consistent everywhere: user override > model defaultTemperature > provider default (0).

After this fix, users can set defaultTemperature: 0.6 in their custom model info for kimi-for-coding and it will be respected.
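
The priority chain can be sketched as a small standalone function (hypothetical helper name; in the PR this logic lives inside `getModelParams()`):

```typescript
// Hypothetical sketch of the temperature priority described above:
// user override > model defaultTemperature > provider default (0).
interface ModelInfo {
  defaultTemperature?: number;
}

const PROVIDER_DEFAULT_TEMPERATURE = 0;

function resolveTemperature(
  userTemperature: number | undefined,
  modelInfo: ModelInfo,
): number {
  // 1. A user-set temperature always wins.
  if (userTemperature !== undefined) return userTemperature;
  // 2. Otherwise fall back to the model's defaultTemperature, if present.
  if (modelInfo.defaultTemperature !== undefined) {
    return modelInfo.defaultTemperature;
  }
  // 3. Finally, the provider default.
  return PROVIDER_DEFAULT_TEMPERATURE;
}

// kimi-for-coding with defaultTemperature: 0.6 and no user override:
console.log(resolveTemperature(undefined, { defaultTemperature: 0.6 })); // 0.6
console.log(resolveTemperature(0.2, { defaultTemperature: 0.6 })); // 0.2
console.log(resolveTemperature(undefined, {})); // 0
```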

Test Procedure

  • Added 5 new tests covering temperature handling:
    • defaultTemperature from custom model info is sent when no user temperature is set
    • User-set modelTemperature takes priority over defaultTemperature
    • Default of 0 when neither is set
    • Non-streaming path includes temperature
    • completePrompt() includes temperature
  • Updated 3 existing tests to reflect temperature now being included in non-streaming and completePrompt paths
  • All 53 tests pass, lint passes, type-check passes

Pre-Submission Checklist

  • Issue Linked: This PR is linked to an approved GitHub Issue (see "Related GitHub Issue" above).
  • Scope: My changes are focused on the linked issue (one major feature/fix per PR).
  • Self-Review: I have performed a thorough self-review of my code.
  • Testing: New and/or updated tests have been added to cover my changes.
  • Documentation Impact: No documentation updates required (internal behavior fix).
  • Contribution Guidelines: I have read and agree to the Contributor Guidelines.

Documentation Updates

No documentation updates needed: this is an internal bug fix that makes existing configuration options work as documented.


…andler

The OpenAiHandler computed temperature independently in createMessage(),
ignoring defaultTemperature from custom model info. This caused a 400
error with models like kimi-for-coding that only accept specific
temperature values (e.g. 0.6).

Changes:
- getModel() now detects deepseek reasoner models and passes the correct
  defaultTemperature to getModelParams()
- createMessage() streaming path uses temperature from getModel() instead
  of computing it separately
- createMessage() non-streaming path now includes temperature (was missing)
- completePrompt() now includes temperature (was missing)

This makes getModelParams() the single source of truth for temperature,
which respects: user overrides > model defaultTemperature > provider default.

Closes #12142
